On the performance of novice evaluators in usability evaluations

Authors

  • Panayiotis Koutsabasis
  • Thomas Spyrou
  • Jenny S. Darzentas
  • John Darzentas
Abstract

The paper investigates the performance of novice evaluators in usability evaluations by presenting the results of a comparative usability evaluation conducted by nine novice evaluator teams. The teams performed considerably well in terms of the validity of their results, which supports their participation in usability evaluation projects. The thoroughness of the results was relatively stable, at about 20-25% of the total number of problems found, for eight of the nine teams, which gives a clear indication of the extent to which novice evaluators can identify usability problems. The consistency of the results was not satisfactory, although it was similar to that reported in other studies involving professional evaluators. The paper suggests that when novice evaluators have to be employed for usability evaluations and it is important to find most usability problems, parallel usability evaluations can provide overall valid and thorough results.


Similar articles

The Evaluator Effect: A Chilling Fact About Usability Evaluation Methods

Computer professionals have a need for robust, easy-to-use usability evaluation methods (UEMs) to help them systematically improve the usability of computer artefacts. However, cognitive walkthrough, heuristic evaluation, and thinking-aloud studies – three of the most widely used UEMs – suffer from a substantial evaluator effect in that multiple evaluators evaluating the same interface with the...


Group vs Individual Web Accessibility Evaluations: Effects with Novice Evaluators



A Web-Based Evaluation Framework for Supporting Novice and Expert Evaluators of Adaptive E-Learning Systems

Based upon the analysed results of an evidence based study conducted over a period of four years, it is clear that it is difficult to identify, for a given evaluation objective, the range of appropriate evaluation techniques (methods, metrics, criteria and approaches to be used). This paper presents a Web-based evaluation framework which is freely available online, designed to support both novi...


Investigating the usability of an Integrated Research Automation System (SEAT): Heuristic Evaluation

Background and Objectives: Today, many hardware and software products, including office automation software and web-based systems, are used by staff, including professors and employees of different departments. Websites are considered one of the main aspects of competition in any organization. This study aims to investigate the usability of the Integrated Research Automation Sys...


Do patterns help novice evaluators? A comparative study

Evaluating e-learning systems is a complex activity which requires considerations of several criteria addressing quality in use as well as educational quality. Heuristic evaluation is a widespread method for usability evaluation, yet its output is often prone to subjective variability, primarily due to the generality of many heuristics. This paper presents the Pattern-Based (PB) inspection, whi...




Publication date: 2007